Attention really is all you need – The Encoder
pub.towardsai.net · 1h
🤖 Transformers
Domain-specific Languages and Code Synthesis Using Haskell
queue.acm.org · 2d ·
๐Ÿ“Type Theory
Generative AI and the P=NP problem
lesswrong.com · 14h
🧮 SMT Solvers
Omnidirectional type inference for ML: principality any way
arxiv.org · 2d
🎯 Hindley-Milner
Position-Candidate-Hypothesis (PCH) Paradigm: A New Research Direction for NP-Complete Problems
dev.to · 23h ·
Discuss: DEV
🧮 SMT Solvers
How do you guys recommend learning Zig for beginners?
pedropark99.github.io · 1h ·
Discuss: r/Zig
⚙️ Zig
I'm building a language that compiles Haskell-style Monads and RAII down to high-performance C. I call it Cicili
github.com · 1d ·
λ Functional Programming
ML Systems Textbook by Harvard
mlsysbook.ai · 5h ·
Discuss: Hacker News
🚀 MLOps
GNN From Scratch
cultured-avenue-f13.notion.site · 7h ·
Discuss: r/programming
🕸️ GraphBLAS
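A minimal sketch in the spirit of the "GNN From Scratch" post above: one GCN-style message-passing layer in plain NumPy (symmetric normalisation over a self-loop-augmented adjacency matrix, then a linear map and ReLU). The post may build a different GNN variant; the toy graph, feature sizes, and random weights here are assumptions for illustration only.

```python
import numpy as np

def gcn_layer(A, H, W):
    # One graph-convolution layer: average each node's neighbourhood
    # (including itself) with symmetric normalisation, then project and ReLU.
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    deg = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))  # D^(-1/2)
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

# Toy 4-node path graph 0-1-2-3 with made-up features and weights.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))      # node features
W = rng.normal(size=(3, 2))      # layer weights
print(gcn_layer(A, H, W).shape)  # (4, 2): a new 2-dim embedding per node
```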
Speculative Decoding: Making LLMs Faster Without Sacrificing Quality
dev.to · 22h ·
Discuss: DEV
💬 Prompt Engineering
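For the speculative decoding post above, a toy sketch of the standard draft-then-verify loop: a cheap draft model proposes k tokens, the target model accepts each with probability min(1, p/q), and a rejected position is resampled from the residual distribution max(p - q, 0), which keeps the overall output distribution equal to sampling from the target alone. Both "models" below are fabricated softmax distributions over a 16-token vocabulary, assumed purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB = 16

def toy_dist(ctx, temperature):
    # Deterministic stand-in for a language model: a context-seeded softmax.
    seed = abs(hash(tuple(ctx))) % (2**32)
    logits = np.random.default_rng(seed).normal(size=VOCAB) / temperature
    e = np.exp(logits - logits.max())
    return e / e.sum()

def target_dist(ctx):  # the "large" model whose distribution we must match
    return toy_dist(ctx, temperature=1.0)

def draft_dist(ctx):   # the "small" model: a flatter, rougher approximation
    return toy_dist(ctx, temperature=2.0)

def speculative_step(ctx, k=4):
    # 1) Draft k tokens with the cheap model.
    drafted, c = [], list(ctx)
    for _ in range(k):
        q = draft_dist(c)
        t = int(rng.choice(VOCAB, p=q))
        drafted.append((t, q))
        c.append(t)
    # 2) Verify them left to right against the target model.
    accepted, c = [], list(ctx)
    for t, q in drafted:
        p = target_dist(c)
        if rng.random() < min(1.0, p[t] / q[t]):   # accept the draft token
            accepted.append(t)
            c.append(t)
        else:                                      # reject: resample from residual
            residual = np.maximum(p - q, 0.0)
            residual /= residual.sum()
            accepted.append(int(rng.choice(VOCAB, p=residual)))
            return accepted
    # 3) All drafts accepted: take one extra token from the target model.
    accepted.append(int(rng.choice(VOCAB, p=target_dist(c))))
    return accepted

print(speculative_step([1, 2, 3]))  # up to k+1 tokens per verification pass
```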
Reproachfully Presenting Resilient Recursive Descent Parsing
thunderseethe.dev · 4d
๐Ÿ“Parsing
The “Jankiest” way of writing Ruby gems
mauricio.szabo.link · 2h
🔵 Clojure
Smoothsort Demystified
keithschwarz.com · 1d
⚡ Quicksort
Structured Output Generation in LLMs: JSON Schema and Grammar-Based Decoding
pub.towardsai.net · 1d
🌳 Tree-sitter
Understanding neural networks through sparse circuits
openai.com · 2d ·
Discuss: Hacker News
📱 Edge AI
Beyond Quacking: Deep Integration of Language Models and RAG into DuckDB
vldb.org · 1d ·
Discuss: Hacker News
🔥 DataFusion
Refactoring Legacy: Part 1 - DTO's & Value Objects
clegginabox.co.uk · 10h ·
Discuss: r/programming
🎨 API Design
Solving Project Euler #45
loriculus.org · 6h ·
Discuss: Hacker News
📊 Dynamic Programming
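Alongside the Project Euler #45 post above, a short solution under the standard problem statement (find the next number after T(285) = P(165) = H(143) = 40755 that is simultaneously triangular, pentagonal, and hexagonal). It uses the fact that every hexagonal number H(n) = n(2n - 1) is already triangular, so only pentagonality needs checking.

```python
from math import isqrt

def is_pentagonal(x):
    # x = n(3n - 1)/2 has a positive integer solution n = (1 + sqrt(1 + 24x)) / 6.
    d = 1 + 24 * x
    r = isqrt(d)
    return r * r == d and (1 + r) % 6 == 0

# Every hexagonal number is triangular (H(n) = T(2n - 1)), so walk the
# hexagonal numbers past H(143) = 40755 and test each for pentagonality.
n = 144
while True:
    h = n * (2 * n - 1)
    if is_pentagonal(h):
        print(h)  # 1533776805
        break
    n += 1
```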
A Deep Dive into Self-Attention and Multi-Head Attention in Transformers
medium.com · 18h ·
Discuss: r/LocalLLaMA
🤖 Transformers
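To make the self-attention deep dive above concrete, here is a minimal NumPy sketch of scaled dot-product attention plus an unbatched, loop-based multi-head wrapper. The sequence length, model width, and random weights are assumptions for illustration, not values taken from the post.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)   # subtract max for stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Q, K: (seq_len, d_k); V: (seq_len, d_v)
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)           # (seq_len, seq_len) similarities
    weights = softmax(scores, axis=-1)        # each row is a distribution
    return weights @ V                        # weighted mix of value vectors

def multi_head_attention(X, Wq, Wk, Wv, Wo, num_heads):
    # X: (seq_len, d_model); each W: (d_model, d_model); heads split d_model.
    d_model = X.shape[1]
    d_head = d_model // num_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    heads = []
    for h in range(num_heads):
        sl = slice(h * d_head, (h + 1) * d_head)
        heads.append(scaled_dot_product_attention(Q[:, sl], K[:, sl], V[:, sl]))
    return np.concatenate(heads, axis=-1) @ Wo  # concat heads, mix with W_o

# Toy usage: 5 tokens, d_model = 8, 2 heads, random weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))
Wq, Wk, Wv, Wo = (rng.normal(size=(8, 8)) * 0.1 for _ in range(4))
print(multi_head_attention(X, Wq, Wk, Wv, Wo, num_heads=2).shape)  # (5, 8)
```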